Feedforward architectures driven by inhibitory interactions
Authors
Abstract
Similar articles
Neural correlation is stimulus modulated by feedforward inhibitory circuitry.
Correlated variability of neural spiking activity has important consequences for signal processing. How incoming sensory signals shape correlations of population responses remains unclear. Cross-correlations between spiking of different neurons may be particularly consequential in sparsely firing neural populations such as those found in layer 2/3 of sensory cortex. In rat whisker barrel cortex...
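The spike-count correlations discussed above can be illustrated with a small sketch. The per-trial counts and the plain Pearson measure below are illustrative assumptions, not data or methods from the study itself:

```python
import math

def spike_count_correlation(counts_a, counts_b):
    """Pearson correlation between two neurons' per-trial spike counts."""
    n = len(counts_a)
    mean_a = sum(counts_a) / n
    mean_b = sum(counts_b) / n
    cov = sum((a - mean_a) * (b - mean_b)
              for a, b in zip(counts_a, counts_b)) / n
    var_a = sum((a - mean_a) ** 2 for a in counts_a) / n
    var_b = sum((b - mean_b) ** 2 for b in counts_b) / n
    return cov / math.sqrt(var_a * var_b)

# Hypothetical per-trial spike counts for two sparsely firing neurons
neuron1 = [0, 1, 0, 2, 1, 0, 3, 1]
neuron2 = [0, 1, 1, 2, 0, 0, 2, 1]
print(round(spike_count_correlation(neuron1, neuron2), 3))
```

In sparsely firing populations such as layer 2/3, many trials contain zero counts, so even modest shared drive from feedforward inhibition can visibly shift this correlation.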
Pipelined Radix-2^k Feedforward FFT Architectures
The appearance of radix-2^2 was a milestone in the design of pipelined FFT hardware architectures. Later, radix-2^2 was extended to radix-2^k. However, radix-2^k was only proposed for single-path delay feedback (SDF) architectures, but not for feedforward ones, also called multi-path delay commutator (MDC). This paper presents the radix-2^k feedforward (MDC) FFT architectures. In feedforward architecture...
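The radix-2 butterfly decomposition underlying these hardware architectures can be sketched in software. The recursive decimation-in-time form below is an illustration of the decomposition only, not the pipelined MDC design from the paper:

```python
import cmath

def fft_radix2(x):
    """Recursive radix-2 decimation-in-time FFT (length must be a power of two)."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft_radix2(x[0::2])  # transform of even-indexed samples
    odd = fft_radix2(x[1::2])   # transform of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        twiddle = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + twiddle           # upper butterfly output
        out[k + n // 2] = even[k] - twiddle  # lower butterfly output
    return out
```

Each stage combines two half-size transforms with one complex multiply per butterfly; the radix-2^k variants save multipliers by exploiting trivial twiddle factors across stages, which is what SDF and MDC architectures pipeline in hardware.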
Generalized neuron: Feedforward and recurrent architectures
Feedforward neural networks such as multilayer perceptrons (MLP) and recurrent neural networks are widely used for pattern classification, nonlinear function approximation, density estimation and time series prediction. A large number of neurons are usually required to perform these tasks accurately, which makes the MLPs less attractive for computational implementations on resource constrained ...
Symmetries and discriminability in feedforward network architectures
This paper investigates the effects of introducing symmetries into feedforward neural networks in what are termed symmetry networks. This technique allows more efficient training for problems in which we require the output of a network to be invariant under a set of transformations of the input. The particular problem of graph recognition is considered. In this case the network is designed to d...
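The idea of building invariance into a feedforward network can be shown with a minimal sketch. Weight sharing plus sum-pooling is one standard way to obtain invariance under input permutations; the function and parameters below are illustrative assumptions, not the paper's construction:

```python
import math

def invariant_readout(x, w_shared, bias):
    """Shared-weight readout: applying the same weight to every input and
    sum-pooling makes the output invariant under any permutation of x."""
    return math.tanh(sum(w_shared * xi for xi in x) + bias)
```

Because every input passes through the same weight and the pooled sum ignores ordering, permuting the inputs cannot change the output; symmetry networks generalize this by tying weights according to the transformation group the output must be invariant under.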
Feedforward Approximations to Dynamic Recurrent Network Architectures
Recurrent neural network architectures can have useful computational properties, with complex temporal dynamics and input-sensitive attractor states. However, evaluation of recurrent dynamic architectures requires solving systems of differential equations, and the number of evaluations required to determine their response to a given input can vary with the input or can be indeterminate altogeth...
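The contrast described here can be sketched concretely: evaluating a recurrent network means integrating a differential equation until the state settles, whereas a feedforward approximation unrolls a fixed number of steps. The rate equation, weights, and depth below are illustrative assumptions, not the paper's model:

```python
import math

def recurrent_step(state, inp, w_rec, w_in, dt=0.1):
    """One Euler step of a simple rate model: ds/dt = -s + tanh(W_rec s + w_in u)."""
    n = len(state)
    drive = []
    for i in range(n):
        total = sum(w_rec[i][j] * state[j] for j in range(n)) + w_in[i] * inp
        drive.append(math.tanh(total))
    return [s + dt * (-s + d) for s, d in zip(state, drive)]

def feedforward_unroll(inp, w_rec, w_in, depth=50):
    """Fixed-depth unrolling: a feedforward approximation to the settling dynamics.

    Unlike the recurrent evaluation, the cost here is fixed by `depth`,
    independent of how quickly the dynamics actually converge.
    """
    state = [0.0] * len(w_in)
    for _ in range(depth):
        state = recurrent_step(state, inp, w_rec, w_in)
    return state
```

With no recurrent coupling (`w_rec` all zero) the fixed point is simply `tanh(w_in * u)`, and 50 unrolled steps get close to it; with strong recurrence the required depth can grow or become input-dependent, which is the evaluation problem the abstract describes.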
Journal
Journal title: Journal of Computational Neuroscience
Year: 2017
ISSN: 0929-5313, 1573-6873
DOI: 10.1007/s10827-017-0669-1